In mathematics, a self-adjoint operator on a complex vector space ''V'' with inner product <math>\langle \cdot, \cdot \rangle</math> is an operator (a linear map ''A'' from ''V'' to itself) that is its own adjoint:
: <math>\langle Av, w \rangle = \langle v, Aw \rangle</math>
for all vectors ''v'' and ''w'' in ''V''. If ''V'' is finite-dimensional with a given orthonormal basis, this is equivalent to the condition that the matrix of ''A'' is Hermitian, i.e., equal to its conjugate transpose ''A''*. By the finite-dimensional spectral theorem, ''V'' has an orthonormal basis such that the matrix of ''A'' relative to this basis is a diagonal matrix with real entries. This article considers generalizations of this concept to operators on Hilbert spaces of arbitrary dimension.

Self-adjoint operators are used in functional analysis and quantum mechanics. In quantum mechanics their importance lies in the Dirac–von Neumann formulation of quantum mechanics, in which physical observables such as position, momentum, angular momentum and spin are represented by self-adjoint operators on a Hilbert space. Of particular significance is the Hamiltonian
: <math>H\psi = -\frac{\hbar^2}{2m} \nabla^2 \psi + V\psi,</math>
which as an observable corresponds to the total energy of a particle of mass ''m'' in a real potential field ''V''. Differential operators are an important class of unbounded operators.

The structure of self-adjoint operators on infinite-dimensional Hilbert spaces essentially resembles the finite-dimensional case: operators are self-adjoint if and only if they are unitarily equivalent to real-valued multiplication operators. With suitable modifications, this result can be extended to possibly unbounded operators on infinite-dimensional spaces. Since an everywhere-defined self-adjoint operator is necessarily bounded, one needs to be more attentive to the domain issue in the unbounded case. This is explained below in more detail.

== Symmetric operators ==
A linear operator ''A'' on a Hilbert space ''H'' is called symmetric if, in bracket notation,
: <math>\langle Ax \mid y \rangle = \langle x \mid Ay \rangle</math>
for all elements ''x'' and ''y'' in the domain of ''A''. Sometimes, such an operator is only called symmetric if it is also densely defined.

More generally, a partially defined linear operator ''A'' from a topological vector space ''E'' into its continuous dual space ''E''<sup>∗</sup> is said to be symmetric if
: <math>\langle Ax, y \rangle = \langle Ay, x \rangle</math>
for all elements ''x'' and ''y'' in the domain of ''A'', where the bracket denotes the duality pairing. This usage is fairly standard in the functional analysis literature.

A symmetric ''everywhere-defined'' operator is self-adjoint (see below for the definition). By the Hellinger–Toeplitz theorem, a symmetric ''everywhere-defined'' operator is also bounded.

In the physics literature, the term Hermitian is used in place of the term symmetric. Note, however, that the physics literature generally glosses over the distinction between operators that are merely symmetric and operators that are actually self-adjoint (as defined in the next section).

The previous definition agrees with the one for matrices given in the introduction to this article, if we take as ''H'' the Hilbert space '''C'''<sup>''n''</sup> with the standard dot product and interpret a square matrix as a linear operator on this Hilbert space. It is, however, much more general, since there are important infinite-dimensional Hilbert spaces.

The spectrum of any bounded symmetric operator is real; in particular all its eigenvalues are real, although a symmetric operator may have no eigenvalues. A general version of the spectral theorem which also applies to bounded symmetric operators (see Reed and Simon, vol. 1, chapter VII, or other books cited) is stated below.
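In the finite-dimensional case ''H'' = '''C'''<sup>''n''</sup> these statements can be checked directly. The following is a minimal numerical sketch, assuming NumPy is available; the particular 2×2 Hermitian matrix is chosen only for illustration. It verifies that the eigenvalues are real and that the eigenvectors form an orthonormal basis in which the matrix is diagonal.

<syntaxhighlight lang="python">
import numpy as np

# A 2x2 Hermitian matrix, chosen only for illustration:
# A equals its conjugate transpose A*.
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
assert np.allclose(A, A.conj().T)

# eigh is specialized to Hermitian matrices; it returns real eigenvalues
# and an orthonormal set of eigenvectors (the columns of U).
eigenvalues, U = np.linalg.eigh(A)

print(eigenvalues)                                   # real: [1. 4.]
print(np.allclose(U.conj().T @ U, np.eye(2)))        # orthonormal basis: True
print(np.allclose(U @ np.diag(eigenvalues) @ U.conj().T, A))  # A = U D U*: True
</syntaxhighlight>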
If the set of eigenvalues of a symmetric operator is nonempty and the eigenvalues are nondegenerate, then it follows from the definition that eigenvectors corresponding to distinct eigenvalues are orthogonal. Contrary to what is sometimes claimed in introductory physics textbooks, it is possible for symmetric operators to have no eigenvalues at all (although the spectrum of any self-adjoint operator is nonempty).

The example below illustrates a special case in which an (unbounded) symmetric operator does have a set of eigenvectors which constitute a Hilbert space basis. The operator ''A'' below can be seen to have a compact inverse, meaning that the corresponding differential equation ''Af'' = ''g'' is solved by some integral, therefore compact, operator ''G''. The compact symmetric operator ''G'' then has a countable family of eigenvectors which are complete in <math>L^2([0, 1])</math>. The same can then be said for ''A''.

Example. Consider the complex Hilbert space <math>L^2([0, 1])</math> and the differential operator
: <math>A f = -\frac{d^2 f}{dx^2}</math>
defined on the subspace consisting of all complex-valued infinitely differentiable functions ''f'' on [0, 1] with the boundary conditions ''f''(0) = ''f''(1) = 0. Then integration by parts shows that ''A'' is symmetric. Its eigenfunctions are the sinusoids
: <math>f_n(x) = \sin(n \pi x), \qquad n = 1, 2, \ldots,</math>
with the real eigenvalues ''n''<sup>2</sup>π<sup>2</sup>; the well-known orthogonality of the sine functions follows as a consequence of ''A'' being symmetric. We consider generalizations of this operator below.
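These eigenvalues can also be seen concretely in a rough numerical sketch, assuming NumPy; the discretization and grid size below are arbitrary illustrative choices rather than part of the construction above. Replacing ''A'' by the standard second-difference matrix with the same boundary conditions approximately reproduces the eigenvalues ''n''<sup>2</sup>π<sup>2</sup> and exhibits the orthogonality of the sine eigenfunctions.

<syntaxhighlight lang="python">
import numpy as np

# Finite-difference sketch of A f = -f'' on [0, 1] with f(0) = f(1) = 0.
# The number of grid points is an arbitrary illustrative choice.
npts = 400
h = 1.0 / npts
x = np.linspace(h, 1.0 - h, npts - 1)        # interior grid points

# The standard second-difference matrix is real symmetric, mirroring the
# symmetry of A obtained by integration by parts.
D2 = (np.diag(2.0 * np.ones(npts - 1))
      + np.diag(-np.ones(npts - 2), 1)
      + np.diag(-np.ones(npts - 2), -1)) / h**2

eigenvalues = np.linalg.eigvalsh(D2)

# The smallest eigenvalues approximate n^2 * pi^2 for n = 1, 2, 3, ...
print(eigenvalues[:3])
print([n**2 * np.pi**2 for n in (1, 2, 3)])

# Orthogonality of the sine eigenfunctions sin(n*pi*x) in L^2([0, 1]).
f1, f2 = np.sin(np.pi * x), np.sin(2.0 * np.pi * x)
print(np.sum(f1 * f2) * h)                   # approximately 0
</syntaxhighlight>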